deepfake nude image
UK regulator wants to ban apps that can make deepfake nude images of children
The UK's Children's Commissioner is calling for a ban on AI deepfake apps that create nude or sexual images of children, according to a new report. It states that such "nudification" apps have become so prevalent that many girls have stopped posting photos on social media. And though creating or sharing CSAM is illegal, the apps used to create deepfake nude images remain legal. "Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone -- a stranger, a classmate, or even a friend -- could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps," said Children's Commissioner Dame Rachel de Souza.
- Europe > United Kingdom (0.52)
- North America > United States (0.06)
- North America > Canada (0.06)
- Information Technology > Security & Privacy (1.00)
- Government > Regional Government > Europe Government > United Kingdom Government (0.40)
- Government > Military > Cyberwarfare (0.40)
- Health & Medicine > Therapeutic Area > Psychiatry/Psychology (0.36)
Commissioner calls for ban on apps that make deepfake nude images of children
Artificial intelligence "nudification" apps that create deepfake sexual images of children should be immediately banned, amid growing fears among teenage girls that they could fall victim, the children's commissioner for England is warning. Girls said they were stopping posting images of themselves on social media out of a fear that generative AI tools could be used to digitally remove their clothes or sexualise them, according to the commissioner's report on the tools, drawing on children's experiences. Although it is illegal to create or share a sexually explicit image of a child, the technology enabling them remains legal, the report noted. "Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone – a stranger, a classmate, or even a friend – could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps," the commissioner, Dame Rachel de Souza, said.
- Europe > United Kingdom > England (0.25)
- Oceania > Australia (0.05)
- North America > United States (0.05)
- Information Technology > Security & Privacy (0.68)
- Law > Criminal Law (0.52)
- Health & Medicine > Therapeutic Area > Psychiatry/Psychology (0.30)
AI-powered deepfake nude websites are targeted by San Francisco city attorney's lawsuit
David Chiu announced Thursday that his office is suing the operators of 16 AI-powered "undressing" websites that help users create and distribute deepfake nude photos of women and girls. The lawsuit, which city officials said was the first of its kind, accuses the websites' operators of violating state and federal laws that ban deepfake pornography, revenge pornography and child pornography, as well as California's unfair competition law. The names of the sites were redacted in the copy of the suit made public Thursday. Chiu's office has yet to identify the owners of many of the websites, but officials say they hope to uncover their names and hold them accountable. Chiu said the lawsuit has two goals: shutting down these websites and sounding the alarm about this form of "sexual abuse."
- North America > United States > California > San Francisco County > San Francisco (0.44)
- North America > United States > California > Los Angeles County > Beverly Hills (0.08)
- North America > United States > New Jersey (0.06)
- Law > Litigation (1.00)
- Information Technology > Security & Privacy (0.91)
- Law > Government & the Courts (0.75)
A Deepfake Nude Generator Reveals a Chilling Look at Its Victims
As AI-powered image generators have become more accessible, so have websites that digitally remove the clothes of people in photos. One of these sites has an unsettling feature that provides a glimpse of how these apps are used: two feeds of what appear to be photos uploaded by users who want to "nudify" the subjects. The feeds of images are a shocking display of intended victims. WIRED saw some images of girls who were clearly children. Other photos showed adults and had captions indicating that they were female friends or female strangers.